# Polish language optimization

## PLLuM 8x7B Chat GGUF

**Author:** piotrmaciejbednarski · **License:** Apache-2.0 · **Downloads:** 126 · **Likes:** 2
**Tags:** Large Language Model, Transformers

A GGUF-quantized build of PLLuM-8x7B-chat, optimized for local inference and available at multiple quantization levels to suit different hardware budgets.
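Running a GGUF build locally typically takes only a few lines with llama-cpp-python. The snippet below is a minimal sketch; the `.gguf` filename and the Q4_K_M quantization level are assumptions that depend on which file you actually download from the repository.

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path and quantization suffix are hypothetical; substitute the .gguf
# file you downloaded. Chat formatting relies on the template embedded in the GGUF.
from llama_cpp import Llama

llm = Llama(
    model_path="pllum-8x7b-chat.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if available; use 0 for CPU-only
)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Napisz krótki wiersz o Warszawie."}],
    max_tokens=256,
)
print(output["choices"][0]["message"]["content"])
```

Lower quantization levels (e.g. Q4) trade some output quality for a smaller memory footprint, which is the usual reason to pick one GGUF file over another.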
## Llama PLLuM 8B Chat

**Author:** CYFRAGOVPL · **Downloads:** 2,618 · **Likes:** 3
**Tags:** Large Language Model, Transformers, Other

PLLuM is a family of large language models focused on Polish and other Slavic and Baltic languages, with English data included to improve generalization.
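As a Transformers checkpoint, the chat model can be loaded with the standard text-generation pipeline. The sketch below assumes the Hugging Face repository id `CYFRAGOVPL/Llama-PLLuM-8B-chat`, a recent transformers version that accepts chat-message input, and a tokenizer that ships a chat template; check the model card for the exact id.

```python
# Sketch: loading a PLLuM chat model with Hugging Face Transformers.
# The repository id is an assumption inferred from the author/model names above.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="CYFRAGOVPL/Llama-PLLuM-8B-chat",  # assumed repo id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Wyjaśnij krótko, czym jest fotosynteza."}]
result = pipe(messages, max_new_tokens=200)
# With chat input, generated_text is the full message list; the last entry
# is the assistant's reply.
print(result[0]["generated_text"][-1]["content"])
```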
## PLLuM 12B NC Chat

**Author:** CYFRAGOVPL · **Downloads:** 2,673 · **Likes:** 6
**Tags:** Large Language Model, Transformers

PLLuM-12B-nc-chat is the dialogue-tuned, 12-billion-parameter member of the Polish PLLuM family. Designed for Polish and other Slavic and Baltic languages, it combines instruction fine-tuning with preference learning to enable safe, efficient interaction.
## Qra 1B

**Author:** OPI-PG · **License:** Apache-2.0 · **Downloads:** 246 · **Likes:** 20
**Tags:** Large Language Model, Transformers

Qra is a series of Polish-optimized large language models developed jointly by the Polish National Information Processing Institute and Gdańsk University of Technology. This 1B variant is initialized from TinyLlama-1.1B and trained on 90 billion Polish tokens.
## Polka 1.1B

**Author:** eryk-mazus · **License:** Apache-2.0 · **Downloads:** 174 · **Likes:** 8
**Tags:** Large Language Model, Transformers, Supports Multiple Languages

polka-1.1b is a bilingual (Polish and English) text generation model, created by continuing the pre-training of TinyLlama-1.1B on 5.7 billion additional Polish tokens.
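Because polka-1.1b is a base (non-chat) model, plain prompt completion is the natural interface. A minimal sketch, assuming the repository id `eryk-mazus/polka-1.1b` inferred from the listing above:

```python
# Sketch: plain text completion with the bilingual base model.
# The repo id is inferred from the author/model names in this listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eryk-mazus/polka-1.1b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Base models continue the prompt rather than answer instructions.
inputs = tokenizer("Najpiękniejszym polskim miastem jest", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern applies to other base checkpoints in this list, such as Qra 1B.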
## Xtreme S XLSR MLS Upd

**Author:** anton-l · **License:** Apache-2.0 · **Downloads:** 16 · **Likes:** 0
**Tags:** Speech Recognition, Transformers, Other

A Polish speech recognition model fine-tuned from facebook/wav2vec2-xls-r-300m on the Polish (mls.pl) subset of the google/xtreme_s dataset.
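Wav2Vec2-based checkpoints like this one plug into the automatic-speech-recognition pipeline. A sketch, assuming the repository id `anton-l/xtreme_s_xlsr_mls_upd` (inferred from the listing and possibly inexact) and a hypothetical local audio file:

```python
# Sketch: transcribing Polish audio with a fine-tuned wav2vec2 checkpoint.
# The repo id is an assumption; verify it on the model card before use.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="anton-l/xtreme_s_xlsr_mls_upd",  # assumed repo id
)

# wav2vec2 models expect 16 kHz mono audio; the pipeline resamples file
# input via ffmpeg when given a path.
transcription = asr("nagranie.wav")  # hypothetical local recording
print(transcription["text"])
```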